Search results for "Lossless compression"

Showing 10 of 21 documents

Improving Lossless Image Compression with Contextual Memory

2019

With the increased use of image acquisition devices, including cameras and medical imaging instruments, the amount of information ready for long-term storage is also growing. In this paper we give a detailed description of the state-of-the-art lossless compression software PAQ8PX applied to grayscale image compression. We propose a new online learning algorithm for predicting the probability of bits from a stream. We then proceed to integrate the algorithm into PAQ8PX…
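The paper's exact predictor isn't reproduced in this snippet; as a rough illustration of the general idea behind PAQ-style bit prediction, the sketch below mixes several models' bit probabilities in the logistic domain and updates mixing weights online. The model probabilities and learning rate are made up for the example.

```python
import math

def mix(probs, weights):
    """PAQ-style mixing: combine model predictions in the logistic domain."""
    t = sum(w * math.log(p / (1 - p)) for w, p in zip(weights, probs))
    return 1 / (1 + math.exp(-t))  # squash back to a probability

def update(probs, weights, bit, lr=0.02):
    """Online gradient step: models that predicted the observed bit gain weight."""
    err = bit - mix(probs, weights)
    return [w + lr * err * math.log(p / (1 - p)) for w, p in zip(weights, probs)]

# Two toy models: one confident the next bit is 1, one nearly uncertain.
probs, weights = [0.9, 0.55], [0.5, 0.5]
p = mix(probs, weights)      # mixed probability that the next bit is 1
for _ in range(10):          # observe ten 1-bits in a row
    weights = update(probs, weights, 1)
```

After the updates, the confident model carries more weight than the uncertain one, which is the behaviour an online-learning mixer is meant to exhibit.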

lossless compression, image compression, grayscale images, contextual information, geometric weighting, probabilistic method, ensemble learning, image processing and computer vision, coding and information theory
Applied Sciences

Morse Description and Geometric Encoding of Digital Elevation Maps

2004

Two complementary geometric structures for the topographic representation of an image are developed in this work. The first one computes a description of the Morse-topological structure of the image, while the second one computes a simplified version of its drainage structure. The topographic significance of the Morse and drainage structures of digital elevation maps (DEMs) suggests that they can be used as the basis of an efficient encoding scheme. As an application, we combine this geometric representation with an interpolation algorithm and lossless data compression schemes to develop a compression scheme for DEMs. This algorithm achieves high compression while controlling the maximum …

Morse theory, topographic map, image processing, image compression, lossless compression, data compression, interpolation, computer vision, algorithms
IEEE Transactions on Image Processing

A WAVELET OPERATOR ON THE INTERVAL IN SOLVING MAXWELL'S EQUATIONS

2011

In this paper, a differential wavelet-based operator defined on an interval is presented and used in evaluating, in the time domain, the electromagnetic field described by Maxwell's curl equations. The wavelet operator has been generated by using Daubechies wavelets with boundary functions. A spatial differential scheme has been performed and applied to the study of electromagnetic phenomena in a lossless medium. The proposed approach has been successfully tested on a bounded axial-symmetric cylindrical domain.

wavelet operator, Maxwell's equations, curl (mathematics), time domain, bounded function, electromagnetic phenomena, mathematical analysis, lossless compression
Progress In Electromagnetics Research Letters

The rightmost equal-cost position problem.

2013

LZ77-based compression schemes compress the input text by replacing factors in the text with an encoded reference to a previous occurrence, formed by the pair (length, offset). For a given factor, the smaller the offset, the smaller the resulting compression ratio. This is optimally achieved by using the rightmost occurrence of a factor in the previous text. Given a cost function (for instance, the minimum number of bits used to represent an integer), we define the Rightmost Equal-Cost Position (REP) problem as the problem of finding one of the occurrences of a factor whose cost is equal to the cost of the rightmost one. We present the Multi-Layer Suffix Tree data structure that, for…
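A brute-force version of the REP query makes the problem statement concrete. The sketch below uses bit length as the cost function, as in the abstract; the Multi-Layer Suffix Tree of the paper answers such queries far faster, and this naive scan is only meant to define what is being computed.

```python
def bit_cost(n):
    """Minimum number of bits to write the positive integer n in binary."""
    return n.bit_length()

def rightmost_occurrence(text, pos, length):
    """Start of the rightmost occurrence of text[pos:pos+length] lying
    entirely before pos (the smallest offset), or -1 if there is none."""
    return text.rfind(text[pos:pos + length], 0, pos)

def rep(text, pos, length):
    """Naive Rightmost Equal-Cost Position: any earlier occurrence whose
    offset costs as many bits as the rightmost one."""
    i = rightmost_occurrence(text, pos, length)
    target = bit_cost(pos - i)
    for j in range(pos):
        if text[j:j + length] == text[pos:pos + length] and bit_cost(pos - j) == target:
            return j

# "abra" at position 7 of "abracadabra" also occurs at position 0,
# so the reference has offset 7, which costs bit_cost(7) = 3 bits.
match = rep("abracadabra", 7, 4)
```

Any position whose offset has the same bit length as the rightmost occurrence's offset yields an encoding of identical size, which is exactly the freedom the REP problem exploits.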

LZ77 compression, dictionary text compression, text compression, data compression, lossless compression, suffix tree, full-text index, data structures and algorithms, information theory, compression ratio, constant time

Lossless and near-lossless image compression based on multiresolution analysis

2013

There are applications in data compression where quality control is of utmost importance. Certain features in the decoded signal must be exactly, or very accurately, recovered, yet one would like to be as economical as possible with respect to storage and speed of computation. In this paper, we present a multi-scale data-compression algorithm within Harten's interpolatory framework for multiresolution that gives a precise estimate of the error between the original and the decoded signal, measured in the discrete L∞ and Lp (p = 1, 2) norms. The proposed algorithm does not rely on a tensor-product strategy to compress two-dimensional signals, and it provides a priori bound…
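The notion of near-lossless coding with an a priori L∞ bound can be illustrated with a much simpler 1-D scheme than the paper's multiresolution framework: predict each sample from the previously decoded one and quantize the residual with step 2δ+1, which guarantees every reconstructed sample is within δ of the original. This toy analogue is not Harten's algorithm, only a demonstration of error-controlled coding.

```python
def encode(signal, delta):
    """Quantize prediction residuals so |x - decoded(x)| <= delta everywhere."""
    step = 2 * delta + 1
    prev, codes = 0, []
    for x in signal:
        r = x - prev
        # Nearest-integer quantization of r to a multiple of step.
        q = (r + delta) // step if r >= 0 else -((-r + delta) // step)
        codes.append(q)
        prev += q * step          # track what the decoder will reconstruct
    return codes

def decode(codes, delta):
    step = 2 * delta + 1
    prev, out = 0, []
    for q in codes:
        prev += q * step
        out.append(prev)
    return out

signal = [10, 12, 9, 30, 31]
restored = decode(encode(signal, delta=2), delta=2)
```

With delta = 0 the scheme is exactly lossless, mirroring the lossless/near-lossless dichotomy in the title.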

lossless compression, lossy compression, multiresolution analysis, image compression, data compression, data compression ratio, quantization, peak signal-to-noise ratio
Journal of Computational and Applied Mathematics

The Engineering of a Compression Boosting Library: Theory vs Practice in BWT Compression

2006

Data compression is one of the most challenging arenas both for algorithm design and engineering. This is particularly true for Burrows-Wheeler compression, a technique that is important in itself and for the design of compressed indexes. There has been considerable debate on how to design and engineer compression algorithms based on the BWT paradigm. In particular, Move-to-Front encoding is generally believed to be an "inefficient" part of the Burrows-Wheeler compression process. However, only recently have two theoretically superior alternatives to Move-to-Front been proposed, namely Compression Boosting and Wavelet Trees. The main contribution of this paper is to provide the first ex…
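The BWT-plus-Move-to-Front pipeline under debate can be sketched in a few lines. This is a textbook rotation-sorting BWT, not the engineered library the paper benchmarks: the transform clusters equal symbols, and Move-to-Front turns that clustering into runs of small integers that a final entropy coder exploits.

```python
def bwt(s):
    """Burrows-Wheeler transform via sorted rotations ('$' marks the end)."""
    s += "$"
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations)

def mtf(s):
    """Move-to-Front: recently seen symbols get small indices, so the BWT's
    local symbol clustering becomes runs of small numbers."""
    table = sorted(set(s))
    out = []
    for c in s:
        i = table.index(c)
        out.append(i)
        table.insert(0, table.pop(i))
    return out

b = bwt("banana")      # groups the repeated symbols together
codes = mtf(b)         # mostly small indices, cheap to entropy-code
```

Compression Boosting and Wavelet Trees replace the Move-to-Front step in this pipeline, which is exactly the substitution whose practical merit the paper evaluates.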

lossless compression, compression boosting library, boosting, wavelet, algorithm design, algorithms, data compression

From First Principles to the Burrows and Wheeler Transform and Beyond, via Combinatorial Optimization

2007

We introduce a combinatorial optimization framework that naturally induces a class of optimal word permutations with respect to a suitably defined cost function taking into account various measures of relatedness between words. The Burrows and Wheeler transform (bwt) (cf. [M. Burrows, D. Wheeler, A block sorting lossless data compression algorithm, Technical Report 124, Digital Equipment Corporation, 1994]), and its analog for labelled trees (cf. [P. Ferragina, F. Luccio, G. Manzini, S. Muthukrishnan, Structuring labeled trees for optimal succinctness, and beyond, in: Proc. of the 45th Annual IEEE Symposium on Foundations of Computer Science, 2005, pp. 198–207]), are special cases i…

lossless compression, Burrows-Wheeler transform, combinatorial optimization, optimal word permutation, Lyndon word, permutation, suffix tree, time complexity
Theoretical Computer Science

Statistical Modeling of Huffman Tables Coding

2005

An innovative algorithm for the automatic generation of Huffman coding tables for semantic classes of digital images is presented. Collecting statistics over a large dataset of corresponding images, we generated Huffman tables for three image classes: landscape, portrait and document. Comparisons between the new tables and the JPEG standard coding tables, also using different quality settings, have shown the effectiveness of the proposed strategy in terms of final bit size (i.e., compression ratio).
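Generating a Huffman table from collected statistics is a standard construction; a minimal sketch follows. The frequency counts are invented stand-ins for the per-class statistics the paper gathers (they are not the paper's data), and a skewed class-specific distribution is what makes a custom table beat a generic one.

```python
import heapq

def huffman_codes(freqs):
    """Build Huffman codes from a symbol -> frequency map."""
    heap = [[f, i, [sym, ""]] for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        for pair in lo[2:]:        # left subtree gets a leading 0
            pair[1] = "0" + pair[1]
        for pair in hi[2:]:        # right subtree gets a leading 1
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0], count] + lo[2:] + hi[2:])
        count += 1
    return {sym: code for sym, code in heap[0][2:]}

# Hypothetical symbol statistics for one image class (skewed toward "0").
portrait_stats = {"0": 60, "1": 25, "2": 10, "3": 5}
codes = huffman_codes(portrait_stats)
```

For this skewed distribution the average code length is 1.55 bits per symbol versus 2 bits for a fixed-length code, the kind of bit-size gain the paper measures against the default JPEG tables.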

Huffman coding, canonical Huffman code, Tunstall coding, JPEG, image compression, lossless compression, statistical model, digital images, coding

Lossless coding of hyperspectral images with principal polynomial analysis

2014

The transform in image coding aims to remove redundancy among data coefficients so that they can be independently coded, and to capture most of the image information in a few coefficients. While the second goal ensures that discarding coefficients will not lead to large errors, the first goal ensures that simple (point-wise) coding schemes can be applied to the retained coefficients with optimal results. Principal Component Analysis (PCA) provides the best independence and data compaction for Gaussian sources. Yet, non-linear generalizations of PCA may provide better performance for more realistic non-Gaussian sources. Principal Polynomial Analysis (PPA) generalizes PCA by removing the non-li…
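The PCA baseline that PPA generalizes can be shown on a toy two-band signal, where PCA reduces to a single rotation that zeroes the cross-band covariance and compacts the energy into one component. The synthetic "bands" below are invented for illustration; this is the linear baseline only, not the paper's polynomial method.

```python
import math
import random

random.seed(0)
# Toy two-band "hyperspectral" data: band 2 strongly correlated with band 1.
b1 = [random.gauss(0, 1) for _ in range(1000)]
b2 = [x + random.gauss(0, 0.1) for x in b1]

def cov(u, v):
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (len(u) - 1)

# For two bands, PCA is a single rotation by the Jacobi angle that
# diagonalizes the 2x2 covariance matrix.
c11, c22, c12 = cov(b1, b1), cov(b2, b2), cov(b1, b2)
theta = 0.5 * math.atan2(2 * c12, c11 - c22)
p1 = [x * math.cos(theta) + y * math.sin(theta) for x, y in zip(b1, b2)]
p2 = [-x * math.sin(theta) + y * math.cos(theta) for x, y in zip(b1, b2)]

# The rotation is orthogonal, so inverting it recovers the bands exactly
# (up to float rounding); lossless coders use integer approximations of it.
b1_rec = [x * math.cos(theta) - y * math.sin(theta) for x, y in zip(p1, p2)]
```

After the rotation, p1 carries nearly all the variance and p2 is almost decorrelated noise, which is the independence-plus-compaction property the abstract describes and that PPA improves on for non-Gaussian sources.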

principal component analysis, dimensionality reduction, data compaction, hyperspectral imaging, lossless compression, entropy, rounding, Gaussian sources, pattern recognition
2014 IEEE International Conference on Image Processing (ICIP)

Optimal Partitions of Strings: A New Class of Burrows-Wheeler Compression Algorithms

2003

The Burrows-Wheeler transform [1] is one of the mainstays of lossless data compression. In most cases, its output is fed to Move-to-Front or other variations of symbol-ranking compression. One of the main open problems [2] is to establish whether Move-to-Front, or, more generally, symbol-ranking compression, is an essential part of the compression process. We settle this question positively by providing a new class of Burrows-Wheeler algorithms that use optimal partitions of strings, rather than symbol ranking, for the additional step. Our technique is a quite surprising specialization to strings of partitioning techniques devised by Buchsbaum et al. [3] for two-dimensional table compression…

lossless compression, Burrows-Wheeler transform, strings, entropy, pattern matching, data compression